73 research outputs found

    Development of Algorithm for Calculating Data Packet Transmission Delay in Software-Defined Networks

    The relevance of this type of network is associated with the development and improvement of protocols, methods, and tools for verifying routing policies, together with algorithmic models describing various aspects of SDN, which determined the purpose of this study. The main purpose of this work is to develop specialized methods for estimating the maximum end-to-end delay of packet transmission over an SDN infrastructure. Methods of network calculus theory are used to build a model for estimating the maximum transmission delay of a data packet. This theory rests on obtaining deterministic bounds by analyzing best- and worst-case scenarios for individual parts of the network and then combining them optimally. The developed method of theoretical evaluation was found to demonstrate high accuracy, and it is shown that the developed algorithm can be used to estimate SDN performance. By comparing different possible configurations, conclusions can be drawn about the optimality of the configuration of network elements. Furthermore, the proposed algorithm for calculating an upper bound on packet transmission delay can reduce network maintenance costs by detecting inconsistencies between network equipment settings and requirements. The scientific novelty of these results is that the achievable upper bound on data delay can now be calculated in polynomial time for arbitrary tree topologies, not only when the network handlers are arranged in tandem. DOI: 10.28991/ESJ-2022-06-05-010
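
    The abstract does not reproduce the algorithm itself, so the following is only a minimal network-calculus sketch of the kind of deterministic bound involved: a token-bucket flow (burst b, rate r) crossing a tandem of rate-latency servers (R_i, T_i), for which the end-to-end service curve is again rate-latency and yields the classic "pay bursts only once" bound. The function name and the example values are illustrative assumptions.

```python
# Illustrative sketch (not the paper's algorithm): a classic network-calculus
# delay bound for a token-bucket flow crossing a tandem of rate-latency servers.
# Assumed inputs: burst b [bits], flow rate r [bit/s], per-node service rate R_i
# and latency T_i [s]. The tandem's end-to-end service curve is again rate-latency
# with rate min(R_i) and latency sum(T_i), giving the "pay bursts only once"
# bound D <= sum(T_i) + b / min(R_i), valid when r <= min(R_i).

def tandem_delay_bound(b: float, r: float, nodes: list[tuple[float, float]]) -> float:
    """nodes is a list of (R_i, T_i) pairs for the servers on the path."""
    R_min = min(R for R, _ in nodes)
    T_sum = sum(T for _, T in nodes)
    if r > R_min:
        raise ValueError("Flow rate exceeds bottleneck service rate; no finite bound.")
    return T_sum + b / R_min

# Example: 12 kbit burst, 1 Mbit/s flow, three switches on the path.
print(tandem_delay_bound(b=12_000, r=1e6,
                         nodes=[(10e6, 0.002), (5e6, 0.001), (10e6, 0.003)]))
```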

    Forming the Architecture of a Multi-Layered Model of Physical Data Storage for Complex Telemedicine Systems

    The relevance of this research is determined by the need to improve data storage technologies for complex telemedicine systems. The objective is to create a multi-layered data storage model for complex telemedicine systems that ensures the fullest use of their capacity and the timely expansion of existing storage. The research is based on an analysis of existing opportunities and problems in the field of data storage technologies. An analysis of the main features of the development of data storage technologies revealed that existing models lack a detailed description of the recording and physical storage of data bits, which is necessary for describing the storage process. Different architectures are reviewed, and their strengths and weaknesses are discussed. In a demonstration experiment using the Kohonen neural network as a tool for placing objects in accordance with the required parameters, it is shown that the proposed storage resource management model is operable and solves the problem of rational use of physical resources. As a result, a multi-layered data storage model is proposed that combines the organizational and technological levels of the storage process. The distinguishing feature of this model is the matching of storage organization levels, data media, and the characteristics of physical storage and stored files. DOI: 10.28991/HIJ-2023-04-04-09
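
    The abstract only mentions the Kohonen neural network as the placement tool, so the sketch below is a generic, hand-rolled self-organizing map that assigns objects (described here by three made-up parameters such as access frequency, size, and required latency) to cells of a small grid standing in for storage resources. It is not the paper's model; all names, sizes, and features are assumptions.

```python
# Minimal self-organizing-map sketch: place objects described by a few normalized
# parameters onto a small grid of cells, each cell standing in for a storage class.
import numpy as np

rng = np.random.default_rng(0)
grid_w, grid_h, n_features = 4, 4, 3            # 4x4 map, 3 object parameters (assumed)
weights = rng.random((grid_w * grid_h, n_features))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)])

def train(data, epochs=50, lr0=0.5, sigma0=2.0):
    global weights
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                     # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5         # shrinking neighbourhood
        for x in data:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
            d = np.linalg.norm(coords - coords[bmu], axis=1)
            h = np.exp(-d**2 / (2 * sigma**2))                     # neighbourhood weights
            weights += lr * h[:, None] * (x - weights)

objects = rng.random((200, n_features))          # stand-in normalized object parameters
train(objects)
placement = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in objects]
print(placement[:10])                            # map cell (storage class) per object
```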

    The Isotope Effect in Superconductors

    We review some aspects of the isotope effect (IE) in superconductors. Our focus is on the influence of factors not related to the pairing mechanism. After summarizing the main results obtained for conventional superconductors, we review the effect of magnetic impurities, the proximity effect, and non-adiabaticity on the value of the isotope coefficient (IC). We discuss the isotope effect of $T_c$ and of the penetration depth $\delta$. The theory is applied to conventional and high-$T_c$ superconductors. Experimental results obtained for YBa$_2$Cu$_3$O$_{7-\delta}$-related materials (Zn- and Pr-substituted as well as oxygen-depleted systems) and for La$_{2-x}$Sr$_x$CuO$_4$ are discussed. Comment: 31 pages, 10 figures. Review article to appear in "Pair Correlation in Many Fermions Systems", Plenum Press 199
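
    For context, the isotope coefficient discussed in the review is conventionally defined through the power-law dependence of the critical temperature on the ionic mass, taking the value 1/2 in the simplest BCS treatment; a standard statement of that definition (not a formula quoted from the article) is:

```latex
% Conventional definition of the isotope coefficient (IC):
\[
  T_c \propto M^{-\alpha},
  \qquad
  \alpha \equiv -\frac{\mathrm{d}\ln T_c}{\mathrm{d}\ln M},
\]
% with \alpha = 1/2 in the simplest BCS (phonon-mediated) picture;
% the review concerns mechanisms that shift \alpha away from this value.
```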

    Development of an Algorithm for Multicriteria Optimization of Deep Learning Neural Networks

    Nowadays, machine learning methods are actively used to process big data. A promising direction is neural networks whose structure is optimized on the principles of self-configuration. Genetic algorithms are applied to solve this nontrivial problem. Most multicriteria evolutionary algorithms use a procedure known as non-dominated sorting to rank solutions. However, the efficiency of the procedures for adding points and updating rank values in non-dominated sorting (incremental non-dominated sorting) remains low. This research therefore improves the performance of these algorithms, including under the condition of asynchronous calculation of the fitness of individuals. The relevance of the research is determined by the fact that, although many scholars and specialists have studied the self-tuning of neural networks, a comprehensive solution to this problem has not yet been proposed. In particular, algorithms for efficient non-dominated sorting under incremental and asynchronous updates in evolutionary multicriteria optimization have not been fully developed to date. To achieve this goal, a hybrid co-evolutionary algorithm was developed that significantly outperforms each of its constituent algorithms, including error backpropagation and genetic algorithms operating separately. The novelty of the obtained results lies in the fact that the developed algorithms have minimal asymptotic complexity. The practical value of the developed algorithms is that they make it possible to solve applied problems of increased complexity in a practically acceptable time. DOI: 10.28991/HIJ-2023-04-01-011
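
    The paper's contribution is an efficient incremental and asynchronous variant of non-dominated sorting, which is not reproduced here; for reference, the sketch below shows only the classic (non-incremental) fast non-dominated sort that such procedures improve upon, with made-up two-objective points as input.

```python
# Baseline reference only: the classic fast non-dominated sort (as in NSGA-II).
# The abstract concerns efficient *incremental* updates of these ranks when points
# arrive asynchronously; that variant is not reproduced here.
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Return a rank (front index, 0 = non-dominated) for each point."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices that point i dominates
    counts = [0] * n                        # how many points dominate i
    for i in range(n):
        for j in range(n):
            if i != j and dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif i != j and dominates(points[j], points[i]):
                counts[i] += 1
    ranks = [0] * n
    front = [i for i in range(n) if counts[i] == 0]
    r = 0
    while front:
        nxt = []
        for i in front:
            ranks[i] = r
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        front, r = nxt, r + 1
    return ranks

print(non_dominated_sort([(1, 5), (2, 2), (3, 1), (4, 4)]))   # -> [0, 0, 0, 1]
```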

    A Fuzzy Approach to the Synthesis of Cognitive Maps for Modeling Decision Making in Complex Systems

    The object of this study is fuzzy cognitive modeling as a means of studying semistructured socio-economic systems. The features of constructing cognitive maps that support the choice of management decisions in complex semistructured socio-economic systems are described. It is shown that further improvement of the technologies needed to develop decision support systems and put them to practical use remains relevant. This work aims to improve the accuracy of cognitive modeling of semistructured systems based on a fuzzy cognitive map for structuring nonformalized situations (MSNS), evaluated with the root-mean-square error (RMSE) and mean absolute scaled error (MASE) metrics. To achieve this goal, the following main methods were used: systems analysis, fuzzy logic and fuzzy set theory, the theory of the integral wavelet transform, and correlation and autocorrelation analyses. As a result, a new methodology for constructing an MSNS was proposed: a map for structuring nonformalized situations that combines the positive properties of earlier fuzzy cognitive maps. Solving modeling problems with this methodology should increase the reliability and quality of the analysis and modeling of semistructured systems and processes under uncertainty. Analysis using open datasets showed that, compared to the classical ARIMA, SVR, MLP, and fuzzy time series models, the proposed model performs better in terms of the MASE and RMSE metrics, which confirms its advantage. Thus, the proposed algorithm is suitable as a mathematical basis for developing software tools for the analysis and modeling of problems in semistructured systems and processes. DOI: 10.28991/ESJ-2022-06-02-012
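
    The MSNS construction is the paper's own methodology and is not reproduced here; the sketch below shows only the generic fuzzy-cognitive-map state-update rule that such maps share, applied to a hypothetical three-concept weight matrix, so all numbers are illustrative assumptions.

```python
# Generic fuzzy-cognitive-map inference step, not the MSNS construction itself:
# concept activations A are repeatedly combined with the signed influence matrix W
# and squashed, A(t+1) = f(A(t) + A(t) @ W), until the map settles.
import numpy as np

def fcm_run(A0, W, steps=30, lam=1.0):
    f = lambda x: 1.0 / (1.0 + np.exp(-lam * x))   # sigmoid squashing function
    A = np.asarray(A0, dtype=float)
    for _ in range(steps):
        A = f(A + A @ W)
    return A

W = np.array([[ 0.0, 0.6, -0.3],    # concept 1 influences 2 (+) and 3 (-); made-up values
              [ 0.0, 0.0,  0.8],
              [-0.4, 0.0,  0.0]])
print(fcm_run([0.5, 0.2, 0.1], W))   # steady-state activation of the three concepts
```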

    Cluster Data Analysis with a Fuzzy Equivalence Relation to Substantiate a Medical Diagnosis

    This study aims to develop a methodology for justifying medical diagnostic decisions based on the clustering of large volumes of statistical information stored in decision support systems. This aim is relevant because the analyzed medical data are often incomplete and inaccurate, which negatively affects the correctness of a medical diagnosis and the subsequent choice of the most effective treatment actions. Clustering is an effective mathematical tool for extracting useful information under conditions of initial data uncertainty. The analysis showed that the most appropriate algorithm for this problem is based on fuzzy clustering and a fuzzy equivalence relation. The methods of the present study build on this algorithm to form a technique for analyzing large volumes of medical data in order to prepare a rationale for medical diagnostic decisions. The proposed methodology involves the sequential implementation of the following procedures: preliminary data preparation, selecting the purpose of the cluster analysis, determining the form in which results are presented, data normalization, selecting criteria for assessing the quality of the solution, applying fuzzy data clustering, evaluating the sample and the results obtained, and using them in further work. The criteria for evaluating the quality of fuzzy clustering include the partition coefficient, the entropy separation criterion, the separation efficiency ratio, and the cluster power criterion. The novelty of the results lies in the fact that the proposed methodology can work with clusters of arbitrary shape and with missing centers, which is impossible with universal algorithms. DOI: 10.28991/esj-2021-01305
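
    The abstract names the core mathematical step, clustering by a fuzzy equivalence relation, without giving details; the sketch below illustrates one standard way this is done: compute the max-min transitive closure of a fuzzy similarity matrix and then take a lambda-cut to obtain crisp clusters. The similarity values and the cut level are illustrative assumptions, not data from the study.

```python
# Sketch of clustering via a fuzzy equivalence relation: start from a fuzzy
# similarity matrix R, compute its max-min transitive closure (turning it into a
# fuzzy equivalence relation), then take a lambda-cut to obtain crisp clusters.
import numpy as np

def transitive_closure(R):
    """Repeated max-min composition of R with itself until it stops changing."""
    while True:
        S = np.maximum(R, np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1))
        if np.allclose(S, R):
            return S
        R = S

def lambda_cut_clusters(R, lam):
    """Label indices whose closure similarity >= lam; after the closure each class
    is a clique, so a single labelling pass is enough."""
    n = len(R)
    labels = list(range(n))
    for i in range(n):
        for j in range(n):
            if R[i, j] >= lam:
                labels[j] = labels[i] = min(labels[i], labels[j])
    return labels

X = np.array([[1.0, 0.8, 0.1],        # made-up fuzzy similarity between three cases
              [0.8, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
T = transitive_closure(X)
print(lambda_cut_clusters(T, lam=0.7))   # -> [0, 0, 2]: cases 1 and 2 grouped together
```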

    Learning effective amino acid interactions through iterative stochastic techniques

    The prediction of the three-dimensional structures of the native state of proteins from the sequences of their amino acids is one of the most important challenges in molecular biology. An essential ingredient to solve this problem within coarse-grained models is the task of deducing effective interaction potentials between the amino acids. Over the years several techniques have been developed to extract potentials that are able to discriminate satisfactorily between the native and non-native folds of a pre-assigned protein sequence. In general, when these potentials are used in actual dynamical folding simulations, they lead to a drift of the native structure outside the quasi-native basin. In this study, we present and validate an approach to overcome this difficulty. By exploiting several numerical and analytical tools we set up a rigorous iterative scheme to extract potentials satisfying a pre-requisite of any viable potential: the stabilization of proteins within their native basin (less than 3-4 Å cRMS). The scheme is flexible and is demonstrated to be applicable to a variety of parametrizations of the energy function and provides, in each case, the optimal potentials. Comment: RevTeX, 17 pages, 10 eps figures. Proteins: Structure, Function and Genetics (in press)
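
    The authors' iterative scheme is not specified in the abstract; as a schematic illustration of the general idea of iteratively correcting contact energies until the native structure is favored over decoys, the sketch below applies a simple perceptron-like update to random stand-in contact maps. It should not be read as the paper's method.

```python
# Schematic illustration only (not the authors' scheme): a perceptron-like update
# of pairwise contact energies e so that the native structure's contact energy is
# lower than every decoy's by a margin. Structures are represented by vectors
# counting amino-acid pair contacts; the random data below stand in for real maps.
import numpy as np

rng = np.random.default_rng(1)
n_pair_types = 210                               # 20*21/2 amino-acid pair types
native = rng.poisson(1.0, n_pair_types)          # native contact-count vector (stand-in)
decoys = rng.poisson(1.2, (500, n_pair_types))   # decoy contact-count vectors (stand-in)

e = np.zeros(n_pair_types)                       # contact energies to be learned
eta, margin = 0.01, 1.0
for _ in range(200):                             # iterative correction sweeps
    gaps = decoys @ e - native @ e               # decoy energy minus native energy
    violated = gaps < margin                     # decoys not yet separated from the native
    if not violated.any():
        break
    # lower the native energy relative to the average violating decoy
    e -= eta * (native - decoys[violated].mean(axis=0))

print("violations remaining:", int((decoys @ e - native @ e < margin).sum()))
```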

    Uncovering novel mutational signatures by de novo extraction with SigProfilerExtractor

    Mutational signature analysis is commonly performed in cancer genomic studies. Here, we present SigProfilerExtractor, an automated tool for de novo extraction of mutational signatures, and benchmark it against 13 other bioinformatics tools using 34 scenarios encompassing 2,500 simulated signatures found in 60,000 synthetic genomes and 20,000 synthetic exomes. For simulations with 5% noise, reflecting high-quality datasets, SigProfilerExtractor outperforms the other approaches by elucidating between 20% and 50% more true-positive signatures while yielding 5-fold fewer false-positive signatures. Applying SigProfilerExtractor to 4,643 whole-genome- and 19,184 whole-exome-sequenced cancers reveals four novel signatures. Two of the signatures are confirmed in independent cohorts, and one of these signatures is associated with tobacco smoking. In summary, this report provides a reference tool for the analysis of mutational signatures, a comprehensive benchmarking of bioinformatics tools for extracting signatures, and several novel mutational signatures, including one putatively attributed to direct tobacco smoking mutagenesis in bladder tissues.
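
    SigProfilerExtractor itself is a full pipeline with automatic selection of the number of signatures, many NMF replicates, and resampling; none of that is reproduced here. The sketch below only illustrates what de novo extraction means at its core, a non-negative matrix factorization of a mutation-count matrix into signatures and activities, using scikit-learn on random stand-in data.

```python
# Not SigProfilerExtractor's API: a bare-bones illustration of de novo signature
# extraction. A catalogue matrix M of shape (96 trinucleotide contexts, n_samples)
# is factorized as M ~ S @ A, where the columns of S are signatures and A holds
# per-sample activities. The synthetic matrix below is random stand-in data.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(42)
M = rng.poisson(5.0, size=(96, 200)).astype(float)   # mutation counts: contexts x samples

k = 4                                                # assumed number of signatures
model = NMF(n_components=k, init="nndsvda", max_iter=2000, random_state=0)
S = model.fit_transform(M)                           # (96, k) signature matrix
A = model.components_                                # (k, 200) activity matrix

S /= S.sum(axis=0, keepdims=True)                    # normalize each signature to sum to 1
print(S.shape, A.shape, round(model.reconstruction_err_, 2))
```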

    Adaptation of the Landau-Migdal Quasiparticle Pattern to Strongly Correlated Fermi Systems

    A quasiparticle pattern advanced in Landau's first article on Fermi liquid theory is adapted to elucidate the properties of a class of strongly correlated Fermi systems characterized by a Lifshitz phase diagram featuring a quantum critical point (QCP) where the density of states diverges. The necessary condition for stability of the Landau Fermi liquid state is shown to break down in such systems, triggering a cascade of topological phase transitions that lead, without symmetry violation, to states with multi-connected Fermi surfaces. The end point of this evolution is found to be an exceptional state whose spectrum of single-particle excitations exhibits a completely flat portion at zero temperature. Analysis of the evolution of the temperature dependence of the single-particle spectrum yields results that provide a natural explanation of the classical behavior of this class of Fermi systems in the QCP region. Comment: 26 pages, 14 figures. Dedicated to the 100th anniversary of A. B. Migdal's birthday
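
    The "completely flat portion" of the spectrum mentioned here is usually stated, in the fermion-condensation literature, as the single-particle energy being pinned to the chemical potential over a finite momentum interval at zero temperature; a standard form of that condition (not a formula quoted from the article) is:

```latex
% Flat-band (fermion-condensate) condition at T = 0:
\[
  \varepsilon(p) = \mu , \qquad p_i \le p \le p_f ,
\]
% with the quasiparticle occupation n(p) varying continuously between 0 and 1
% on the interval [p_i, p_f] instead of jumping at a single Fermi momentum.
```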